ai-memory
ai-memory — Persistent Memory for Any AI
Zero token cost until recall. Built-in memory systems inject 200+ lines into every message; ai-memory consumes no context tokens until memory_recall is called. Only relevant memories are returned, ranked by six-factor scoring, and TOON-format responses are 79% smaller than JSON.
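Because recall is an explicit MCP tool call, nothing enters context until the client asks. A sketch of the JSON-RPC request a client would send; the request envelope follows the MCP tools/call shape, while the query and limit argument names are assumptions, not confirmed against ai-memory's tool schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "memory_recall",
    "arguments": { "query": "preferred database path", "limit": 5 }
  }
}
```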
ICLR 2025 LongMemEval Benchmark
97.8% R@5 (489/500) · 99.8% R@20 (499/500) · 232 q/s · $0 cloud cost
17 MCP Tools
memory_store · memory_recall · memory_search · memory_list · memory_get · memory_update · memory_delete · memory_promote · memory_forget · memory_link · memory_get_links · memory_consolidate · memory_capabilities · memory_expand_query · memory_auto_tag · memory_detect_contradiction · memory_stats
4 Tiers (all local, no cloud)
keyword → semantic (default) → smart (Ollama) → autonomous (Ollama)
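The tier is chosen at launch. A hypothetical invocation reusing the flags from the Server Config section, swapping in the Ollama-backed smart tier:

```
# Assumes Ollama is running locally; otherwise use the default --tier semantic.
ai-memory --db ~/.claude/ai-memory.db mcp --tier smart
```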
8 AI Platforms
Claude · Codex · Gemini · Cursor · Windsurf · Continue.dev · Grok · Llama
Install
cargo install ai-memory
Rust + SQLite + FTS5 + HNSW. 161 tests. MIT licensed.
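The keyword tier rests on SQLite's FTS5 full-text index. A minimal sketch of that building block using Python's stdlib sqlite3; the memories table, its schema, and the stored strings are illustrative, not ai-memory's actual schema:

```python
import sqlite3

# In-memory database with an FTS5 virtual table, analogous to the
# keyword tier's full-text index over stored memories.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE memories USING fts5(content)")
conn.executemany(
    "INSERT INTO memories(content) VALUES (?)",
    [("user prefers Rust for CLI tools",),
     ("project database lives at ~/.claude/ai-memory.db",),
     ("meeting notes from Tuesday standup",)],
)

# BM25-ranked keyword recall: FTS5's default tokenizer is
# case-insensitive, so 'rust' matches "Rust".
rows = conn.execute(
    "SELECT content FROM memories WHERE memories MATCH ? ORDER BY rank",
    ("rust",),
).fetchall()
print(rows)
```

The semantic and higher tiers layer HNSW vector search on top of this, but the FTS5 index is what makes the zero-dependency keyword tier possible.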
Server Config
{
  "mcpServers": {
    "memory": {
      "command": "ai-memory",
      "args": ["--db", "~/.claude/ai-memory.db", "mcp", "--tier", "semantic"]
    }
  }
}